Guided tutorials for IBM Event Automation
This set of guided tutorials will take you through the key features of Event Automation.
Quickly create a demo Event Automation environment, including topics with live streams of events that you can use to try the tutorials.
When processing events, we can use filter operations to select the input events we are interested in.
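For illustration, a filter is just a predicate applied to each event in the stream. A minimal Python sketch (the event shape and the region field are hypothetical, not from the tutorial):

```python
# Hypothetical order events; a filter keeps only the events matching a predicate.
events = [
    {"id": 1, "region": "EMEA", "total": 120.0},
    {"id": 2, "region": "APAC", "total": 35.0},
    {"id": 3, "region": "EMEA", "total": 80.0},
]

# Select only the EMEA orders, discarding everything else.
emea_orders = [e for e in events if e["region"] == "EMEA"]
print(emea_orders)
```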
When processing events, we can use transform operations to refine input events.
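A transform maps each input event to a refined output event, for example deriving a new property and dropping ones that are not needed. A minimal sketch with invented fields:

```python
# Hypothetical raw events: derive a new property and drop an unneeded field.
raw = [{"id": 1, "price": 10.0, "quantity": 3, "internal_code": "x9"}]

transformed = [
    {"id": e["id"], "order_value": e["price"] * e["quantity"]}  # derived property
    for e in raw
]
print(transformed)  # [{'id': 1, 'order_value': 30.0}]
```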
Processing events over a time window allows you to build a summary view of a situation, which can be useful for identifying overall trends.
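To illustrate the idea, a tumbling window groups events into fixed, non-overlapping intervals and computes a summary for each. A stdlib-only sketch, assuming epoch-second timestamps and a one-minute window:

```python
from collections import defaultdict

# Hypothetical events with epoch-second timestamps.
events = [
    {"ts": 0, "total": 10.0},
    {"ts": 30, "total": 5.0},
    {"ts": 75, "total": 20.0},
]

WINDOW = 60  # tumbling one-minute windows
summary = defaultdict(float)
for e in events:
    window_start = (e["ts"] // WINDOW) * WINDOW  # bucket the event into its window
    summary[window_start] += e["total"]

print(dict(summary))  # {0: 15.0, 60: 20.0}
```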
Many interesting situations require combining multiple streams of events, correlating events across these inputs to derive a new, interesting situation.
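Correlating two streams is conceptually a join on a shared key within a time interval. A simplified sketch (the event shapes and the five-minute interval are assumptions for illustration):

```python
# Hypothetical orders and shipments keyed by order id.
orders = [{"order_id": "A", "ts": 100}, {"order_id": "B", "ts": 200}]
shipments = [{"order_id": "A", "ts": 160}]

INTERVAL = 300  # only correlate events that occur within five minutes of each other
matches = [
    {"order_id": o["order_id"], "order_ts": o["ts"], "ship_ts": s["ts"]}
    for o in orders
    for s in shipments
    if o["order_id"] == s["order_id"] and abs(s["ts"] - o["ts"]) <= INTERVAL
]
print(matches)  # the correlated order/shipment pairs
```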
Once your event processing has identified a situation of interest, a common next step is to automate a response. You can write the output of your processing flows to a Kafka topic to achieve this.
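In Event Processing, the flow itself writes to the destination topic for you; purely to picture the result, producing an event to a topic with the kafka-python client looks roughly like this (the broker address and topic name are placeholders):

```python
import json
from kafka import KafkaProducer  # third-party client, used here only for illustration

producer = KafkaProducer(
    bootstrap_servers="my-kafka:9092",  # placeholder broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish the identified situation so a downstream automation can react to it.
producer.send("flagged.orders", {"order_id": "A", "reason": "high-value"})
producer.flush()
```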
Publish the results of your event processing to the Catalog to allow them to be shared and reused by others.
Learn how Event Processing can help you transform and act on event data.
Enriching a stream of events with static reference data from a database can increase their business relevance.
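To sketch the idea with the standard library only: look up static reference data for each event's key and merge it into the event (the table and fields below are invented for the example):

```python
import sqlite3

# A stand-in reference table; in the tutorial this would be a real database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (sku TEXT PRIMARY KEY, description TEXT)")
db.execute("INSERT INTO products VALUES ('SKU-1', 'Espresso machine')")

events = [{"sku": "SKU-1", "quantity": 2}]

# Enrich each event with the matching reference row.
for e in events:
    row = db.execute(
        "SELECT description FROM products WHERE sku = ?", (e["sku"],)
    ).fetchone()
    e["description"] = row[0] if row else None

print(events)  # [{'sku': 'SKU-1', 'quantity': 2, 'description': 'Espresso machine'}]
```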
Writing custom filter expressions that use dynamically generated properties added to events can help you identify specific situations.
Aggregate processors can identify events that are only interesting if they occur multiple times within a time window.
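The underlying idea is a count per key within a window, emitting only the keys that cross a threshold. A minimal sketch (the keys and the threshold are hypothetical):

```python
from collections import Counter

# Hypothetical login-failure events that all fall inside one time window.
window_events = [{"user": "u1"}, {"user": "u2"}, {"user": "u1"}, {"user": "u1"}]

THRESHOLD = 3  # only interesting if the same user appears this many times
counts = Counter(e["user"] for e in window_events)
flagged = [user for user, n in counts.items() if n >= THRESHOLD]
print(flagged)  # ['u1']
```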
Some systems offer at-least-once delivery assurances and will occasionally generate duplicate events. Removing these duplicates enables consumption by systems that cannot process events idempotently.
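Conceptually, deduplication keeps the first event seen for each unique identifier and drops the rest. A sketch assuming each event carries a unique id:

```python
# Hypothetical at-least-once stream: event 'A' is delivered twice.
events = [{"id": "A", "v": 1}, {"id": "B", "v": 2}, {"id": "A", "v": 1}]

seen = set()
deduplicated = []
for e in events:
    if e["id"] not in seen:  # keep only the first occurrence of each id
        seen.add(e["id"])
        deduplicated.append(e)

print(deduplicated)  # [{'id': 'A', 'v': 1}, {'id': 'B', 'v': 2}]
```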
Identifying types of events that occur most frequently within a time window is a useful way to detect trends.
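As an illustration, a top-N over a window is a count per event type followed by taking the largest counts (the event types and the value of N are made up):

```python
from collections import Counter

# Hypothetical page-view events collected during one window.
window_events = ["search", "checkout", "search", "home", "search", "checkout"]

top_n = Counter(window_events).most_common(2)  # the two most frequent types
print(top_n)  # [('search', 3), ('checkout', 2)]
```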
Events generated by a wide range of producers can arrive out of sequence on the topic, so it is important to reorder them before any time-sensitive processing.
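A simple way to picture the fix: buffer events briefly and re-emit them ordered by event time rather than arrival order. The buffer-everything approach below is only for illustration; real stream processors bound the buffering with watermarks:

```python
# Hypothetical events that arrived out of event-time order.
arrived = [
    {"id": 2, "event_time": 20},
    {"id": 1, "event_time": 10},
    {"id": 3, "event_time": 30},
]

# Re-sequence by the time the event actually happened before windowed processing.
in_order = sorted(arrived, key=lambda e: e["event_time"])
print([e["id"] for e in in_order])  # [1, 2, 3]
```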
You can use App Connect to trigger automations and notifications from event processing results.
Handle complex events with nested properties and arrays to identify a situation and extract only the necessary information.
Find out how to monitor Flink with Prometheus and set up Grafana.
Connect to a schema registry to process a stream of events with formats that change over time.
Unpack each array element into separate events or into new properties in the same event to process the array content.
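The two options, expressed in Python terms: flatten each array element into its own event, or spread the elements into new properties on the original event (the "items" property is a made-up example):

```python
event = {"order_id": "A", "items": ["pen", "book"]}

# Option 1: one new event per array element.
per_element = [{"order_id": event["order_id"], "item": i} for i in event["items"]]
print(per_element)

# Option 2: array elements become numbered properties on the same event.
as_properties = {"order_id": event["order_id"]}
as_properties.update({f"item_{n}": i for n, i in enumerate(event["items"], 1)})
print(as_properties)
```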
Detect the key and headers of your Kafka topic messages and define them as properties in the event source.
Find out how to get more out of Event Streams with these practical tutorials.
Set up Prometheus to monitor your Event Streams installations and visualize the data through Grafana.
Monitor the health of your cluster by using Datadog to capture Kafka broker JMX metrics.
Monitor the health of your cluster by using Splunk to capture Kafka broker JMX metrics.
Receive notifications about the health of your cluster based on monitored metrics.
See an example of setting up a multizone Event Streams installation in a non-zone-aware cluster.